Chris Pollett > Old Classes >
CS156

CS156 Fall 2014 Practice Final

To study for the final I would suggest you:
  (1) Know how to do (by heart) all the practice problems.
  (2) Go over your notes at least three times. The second and third time, try to see how much you can remember from the first time.
  (3) Go over the homework problems.
  (4) Try to create your own problems similar to the ones I have given and solve them.
  (5) Skim the relevant sections from the book.
  (6) If you want to study in groups, at this point you are ready to quiz each other.

The practice final is below. Here are some facts about the actual final:
  (a) It is comprehensive.
  (b) It is closed book and closed notes. Nothing will be permitted on your desk except your pen (or pencil) and the test.
  (c) You should bring photo ID.
  (d) There will be more than one version of the test. Each version will be of comparable difficulty.
  (e) It has 10 problems: six on material covered since the lecture before the midterm, and four on topics covered prior to the midterm.
  (f) Two problems will be taken exactly (modulo typos) from the practice final, and one from the practice midterm.

  1. Mod3(x_1, ..., x_n) is the propositional formula which is true exactly when the number of variables `x_i` that are true in a truth assignment is congruent to 0 mod 3. Write down a CNF formula for Mod3 in the case where `n=6`.
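Not part of the handout, but a candidate answer to problem 1 can be sanity-checked by brute force against the definition of Mod3. The clause construction below is one simple (deliberately non-minimal) way to build a correct CNF; a hand-derived encoding will usually be much smaller.

```python
from itertools import product

def mod3(assignment):
    # Reference semantics of Mod3: true iff the number of true
    # variables is congruent to 0 mod 3.
    return sum(assignment) % 3 == 0

def check_cnf(cnf, n=6):
    # A clause is a list of nonzero ints: +i means x_i, -i means not x_i.
    # Compare the CNF against mod3 on all 2^n truth assignments.
    for bits in product([False, True], repeat=n):
        cnf_val = all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
                      for clause in cnf)
        if cnf_val != mod3(bits):
            return False
    return True

# One correct (but non-minimal) CNF: for each assignment that Mod3
# rejects, add the single clause that is falsified exactly there.
cnf = [[-(i + 1) if b else (i + 1) for i, b in enumerate(bits)]
       for bits in product([False, True], repeat=6)
       if not mod3(bits)]
print(len(cnf), check_cnf(cnf))  # 42 True
```

The 42 clauses come from the 64 - 22 assignments whose count of true variables is not 0, 3, or 6.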
  2. Give the DPLL algorithm and explain each of the three main "shortcuts" it checks for.
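As a study aid for problem 2, here is a minimal Python sketch of DPLL showing the three shortcuts: early termination (a clause already true or already false under the partial assignment), the unit-clause heuristic, and the pure-symbol heuristic. This is my own reconstruction, not the exact pseudocode from lecture.

```python
def dpll(clauses, assignment=None):
    # clauses: iterable of clauses; a clause is a list of nonzero ints
    # (+i for x_i, -i for not x_i). assignment: set of literals made true.
    if assignment is None:
        assignment = set()
    simplified = []
    for clause in clauses:
        if any(lit in assignment for lit in clause):
            continue                      # early termination: clause already true
        rest = [lit for lit in clause if -lit not in assignment]
        if not rest:
            return None                   # early termination: clause already false
        simplified.append(rest)
    if not simplified:
        return assignment                 # every clause satisfied
    for clause in simplified:             # unit clause heuristic
        if len(clause) == 1:
            return dpll(clauses, assignment | {clause[0]})
    literals = {lit for clause in simplified for lit in clause}
    for lit in literals:                  # pure symbol heuristic
        if -lit not in literals:
            return dpll(clauses, assignment | {lit})
    lit = next(iter(literals))            # otherwise, branch on a literal
    result = dpll(clauses, assignment | {lit})
    if result is not None:
        return result
    return dpll(clauses, assignment | {-lit})

print(dpll([[1, 2], [-1]]))  # a model with x_1 false, x_2 true
print(dpll([[1], [-1]]))     # None (unsatisfiable)
```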
  3. (a) Let `x := f(z)` and `y := g(w)`. Explain how the unification algorithm from class would work on these inputs. (b) Now suppose `x := [g(v), f(g(z))]` and `y := [g(f(w)), f(w)]`. Explain how the unification algorithm from class would work on these inputs.
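For tracing problem 3, here is a compact occurs-check unifier in the style of the textbook algorithm (the term encoding — strings for variables, tuples for compound terms — is my own, not the course's notation):

```python
def is_var(t):
    # Variables are bare strings; compound terms are tuples (functor, *args).
    return isinstance(t, str)

def walk(t, s):
    # Follow variable bindings in substitution s.
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    # Occurs check: does variable v appear inside term t under s?
    t = walk(t, s)
    if t == v:
        return True
    if isinstance(t, tuple):
        return any(occurs(v, arg, s) for arg in t[1:])
    return False

def unify(x, y, s=None):
    # Returns a substitution dict, or None on failure.
    if s is None:
        s = {}
    x, y = walk(x, s), walk(y, s)
    if x == y:
        return s
    if is_var(x):
        return None if occurs(x, y, s) else {**s, x: y}
    if is_var(y):
        return unify(y, x, s)
    if isinstance(x, tuple) and isinstance(y, tuple) \
            and x[0] == y[0] and len(x) == len(y):
        for a, b in zip(x[1:], y[1:]):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None

# (a) f(z) vs g(w): fails immediately, the outer functors differ.
print(unify(('f', 'z'), ('g', 'w')))  # None

# (b) unify the two lists element-wise, threading the substitution.
s = {}
for a, b in zip([('g', 'v'), ('f', ('g', 'z'))],
                [('g', ('f', 'w')), ('f', 'w')]):
    s = unify(a, b, s)
print(s)  # v bound to f(w), w bound to g(z)
```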
  4. Consider the problem where you have two socks and two shoes all of which are on the ground. You also have two feet. Your goal is to put on your shoes. Your feet can wear socks, but not shoes directly. Your available actions are to put on socks and put on shoes. Formulate this problem reasonably in PDDL. Then give an example plan solving it.
  5. Show how the Graphplan algorithm would work on the example of the previous problem.
  6. Define the following terms related to knowledge engineering: (a) ontology, (b) reification, (c) taxonomy.
  7. Explain and give an example of the following concepts from probability theory: (a) random variable, (b) marginalization, (c) Bayes' rule.
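A tiny worked example for parts (b) and (c) of problem 7 — the numbers here are my own illustration, not from the course:

```python
# A test for a condition D with prior P(D) = 0.01,
# sensitivity P(+|D) = 0.9, and false-positive rate P(+|~D) = 0.05.
p_d = 0.01
p_pos_given_d = 0.9
p_pos_given_not_d = 0.05

# Marginalization: P(+) = P(+|D)P(D) + P(+|~D)P(~D)
p_pos = p_pos_given_d * p_d + p_pos_given_not_d * (1 - p_d)

# Bayes' rule: P(D|+) = P(+|D)P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_d / p_pos
print(round(p_d_given_pos, 3))  # 0.154
```

Even with a fairly accurate test, the low prior keeps the posterior P(D|+) small — a useful sanity check to remember on the exam.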
  8. Consider the following training set of 4-tuples.
    (T,T,T, F)
    (T,T,F, F)
    (T,F,T, F)
    (T,F,F, F)
    (F,T,T, T)
    
    Here `T` is short for true, `F` is short for false. The first three columns correspond to the variables `x_1`, `x_2`, `x_3`, the last column is the output of some function `f`. Calculate `Gain(x_i)` for `i=1,2,3`. Which variable should we use as the top of a decision tree for `f`?
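The gain computation in problem 8 can be checked numerically. Below, `T`/`F` are encoded as 1/0 and `Gain(x_i)` is the standard information gain, `Entropy(f) - Remainder(x_i)`:

```python
from math import log2

def entropy(labels):
    # Entropy of a list of class labels, in bits.
    n = len(labels)
    return -sum(labels.count(v) / n * log2(labels.count(v) / n)
                for v in set(labels))

# The training set from problem 8, with T -> 1 and F -> 0.
data = [(1, 1, 1, 0), (1, 1, 0, 0), (1, 0, 1, 0), (1, 0, 0, 0), (0, 1, 1, 1)]
ys = [row[3] for row in data]

def gain(i):
    # Information gain of splitting on column i (variable x_{i+1}).
    rem = 0.0
    for v in (0, 1):
        sub = [row[3] for row in data if row[i] == v]
        if sub:
            rem += len(sub) / len(data) * entropy(sub)
    return entropy(ys) - rem

for i in range(3):
    print(f"Gain(x_{i+1}) = {gain(i):.3f}")
# Gain(x_1) ~ 0.722, Gain(x_2) = Gain(x_3) ~ 0.171
```

Splitting on `x_1` separates the classes perfectly here, so its remainder is 0 and its gain equals the full entropy of `f`.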
  9. Give the formal definition of a perceptron. Explain what a feed-forward network is and what a recurrent network is, and give an example of each.
  10. Give and explain the update rule for learning neuron weights from class.
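For problem 10, one common textbook form of the update rule is `w_i <- w_i + alpha * (y - h_w(x)) * x_i` for a threshold perceptron; the in-class version may differ in details such as the threshold convention, so check your notes. A runnable sketch:

```python
def train_perceptron(data, alpha=1.0, epochs=10):
    # Threshold perceptron trained with the rule
    #   w_i <- w_i + alpha * (y - h_w(x)) * x_i,
    # where h_w thresholds w . x at 0 and x_0 = 1 is a fixed bias input.
    n = len(data[0][0])
    w = [0.0] * (n + 1)

    def predict(x):
        s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
        return 1 if s >= 0 else 0

    for _ in range(epochs):
        for x, y in data:
            err = y - predict(x)          # 0 if correct, +/-1 if wrong
            w[0] += alpha * err           # bias weight (its input is fixed at 1)
            for i, xi in enumerate(x):
                w[i + 1] += alpha * err * xi
    return w, predict

# Logical AND is linearly separable, so the rule converges on it.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, predict = train_perceptron(data)
print([predict(x) for x, _ in data])  # [0, 0, 0, 1]
```

Note the update only fires on misclassified examples (the error factor is zero otherwise), which is exactly why the rule stops changing the weights once the data is fit.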